Extract structured data from remote or local LLMs. Predictable output is essential for any serious use of LLMs.
Extract data into Pydantic objects, dataclasses or simple types.
Use the same API for local model files and for remote OpenAI, Mistral AI and other models.
Model management: download models, manage configuration, quickly switch between models.
Tools for evaluating output across local/remote models, for chat-like interaction and more.
No matter how well you craft a prompt begging a model for the output you need, it can always respond with something else. Extracting structured data is a big step toward getting predictable behavior from your models.
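The core idea can be sketched in plain Python: instead of trusting the model's free text, declare the shape you want as a dataclass and validate the reply against it, failing loudly on anything unexpected. This is a minimal illustration of the concept, not this library's actual API; the `parse_into` helper and the `Person` type are hypothetical names used only for the example.

```python
import json
from dataclasses import dataclass, fields

# Hypothetical target type: the structured record we want from the model.
@dataclass
class Person:
    name: str
    age: int

def parse_into(cls, text: str):
    """Parse a model's JSON reply into a dataclass, raising on missing
    or mistyped fields instead of silently accepting free-form text."""
    data = json.loads(text)
    kwargs = {}
    for f in fields(cls):
        if f.name not in data:
            raise ValueError(f"missing field: {f.name}")
        value = data[f.name]
        if not isinstance(value, f.type):
            raise TypeError(f"field {f.name}: expected {f.type.__name__}")
        kwargs[f.name] = value
    return cls(**kwargs)

# Simulated LLM reply constrained to JSON:
reply = '{"name": "Ada", "age": 36}'
person = parse_into(Person, reply)
print(person)  # Person(name='Ada', age=36)
```

A reply with a missing or mistyped field raises immediately, so downstream code only ever sees well-formed `Person` objects. Pydantic models give the same guarantee with richer validation.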